The phrase small-for-size (SFS) graft came into use in clinical liver transplantation in the 1980s. Graft size mismatching had already been discussed in the era of whole or split liver transplantation, when right trisegment grafts were initially used for adults.1 Interest in the issue flourished when living donor liver transplantation (LDLT) was extended to adult patients. Controversies included the pathogenesis and clinical manifestations of SFS grafts and strategies for successful SFS graft transplantation. Meanwhile, the definition of SFS graft has always been ambiguous; it has been defined tentatively in individual studies by the graft weight relative to the body weight or to the estimated whole liver weight (a value derived from the body size). Furthermore, the phrase SFS syndrome, which is equally ambiguous and often defined arbitrarily, is widely used to define SFS graft (and vice versa). One early clinical study attempting to elucidate the impact of graft size mismatching in LDLT has been frequently cited in later articles on the topic: 350 times in 11 years according to the Web of Science.2 This study used 0.8%, 1.0%, 3.0%, and 5.0% of the recipient body weight as graft size category borders to show the prognostic impact. However, the study design had several pitfalls. Children younger than 15 years composed the vast majority of the study population; older patients represented only 10%. The graft sizes encountered in pediatric LDLT are highly variable, in contrast to adult LDLT, in which the range is relatively narrow (0.5%-1.5% of the body weight). Furthermore, the body weight of patients with end-stage liver disease fluctuates with the occurrence of ascites or edema or the use of diuretics. Even the actual graft weight is potentially influenced by the presence of blood or high-viscosity preservation solution.
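The two graft-size metrics mentioned above can be made concrete with a minimal sketch. It computes the graft-to-recipient weight ratio (GRWR) and the graft volume as a percentage of an estimated standard liver volume; the DuBois body-surface-area formula and the Urata standard-liver-volume formula are used here as one common choice of estimators (an assumption of this illustration, not a method specified in the text), and the patient values are hypothetical.

```python
def grwr(graft_weight_g: float, body_weight_kg: float) -> float:
    """Graft weight as a percentage of recipient body weight (GRWR)."""
    return graft_weight_g / (body_weight_kg * 1000) * 100

def standard_liver_volume_ml(height_cm: float, weight_kg: float) -> float:
    """Estimated standard liver volume (SLV) from body size.

    Urata formula: SLV (mL) = 706.2 x BSA (m^2) + 2.4,
    with BSA from the DuBois formula. Other estimators exist and
    give somewhat different values.
    """
    bsa = 0.007184 * height_cm ** 0.725 * weight_kg ** 0.425
    return 706.2 * bsa + 2.4

# Hypothetical example: a 450 g graft for a 60 kg, 170 cm recipient.
print(round(grwr(450, 60), 2))           # GRWR, % of body weight
slv = standard_liver_volume_ml(170, 60)
print(round(450 / slv * 100, 1))         # graft volume, % of estimated SLV
```

As the text notes, both denominators are uncertain in practice: the recipient's body weight fluctuates with ascites, edema, and diuretics, and the estimated whole liver volume is only a rough marker, which is one reason such cutoffs remain contentious.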
Hence, it was uncertain whether the impact of small differences in graft weight was masked by the wide range of variables among recipients, donors, and surgical techniques in adult LDLT. Indeed, multiple subsequent clinical studies of adult LDLT have failed to demonstrate a negative prognostic impact of small differences in graft weight, although the concept of the SFS graft or syndrome has survived.3-5 This is partly because most clinical studies are performed in an intent-to-treat fashion; that is, clinical practice is coupled with appropriate safety measures for graft selection and with technical modifications, including outflow maximization and inflow modification. Many confounding factors thus potentially obscure the true negative effect of SFS grafts. In this context, Moon et al.6 should be congratulated for their article. They have shown that older donor age, presumably accompanied by low adaptability of the graft tissue, affects graft prognosis only when it is combined with an SFS graft. Unfortunately, there remains some room for discussion concerning their analytical method. They censored patient death due to pneumonia as unrelated to graft function but included graft failure due to surgical complications or disease recurrence as potential consequences of graft dysfunction. The definition of graft loss to be censored is often key for survival statistics, yet it is quite difficult to establish and remains an unresolved issue; a similar problem is shared by other studies. The authors' conclusion accords with previous studies suggesting that specific factors can exacerbate the SFS graft effect (eg, disease severity, portal hypertension, donor age, graft steatosis, and ischemic injury).7-11 In other words, the adaptability of SFS grafts is defined by factors other than the actual graft weight.
Experimental and clinical evidence suggests that elevated portal pressure and portal overperfusion form the central pathogenesis of SFS graft sequelae.12-15 A persistent elevation of portal pressure and the resultant hyperperfusion of the graft, attributable to hyperdynamic splanchnic circulation and limited accommodation by the graft, cause portal venular and sinusoidal endothelial injury and the release of deleterious mediators, which ultimately lead to serious graft injury. However, such hyperdynamic portal circulation occurs naturally in the early phase after partial liver transplantation and lasts for several months until final accommodation occurs.16 In addition, sufficient portal flow is a prerequisite for the recovery of graft function and for graft regeneration.17 Once the balance between portal hyperperfusion and graft accommodation is disrupted, or once hyperperfusion exceeds some threshold, portal flow becomes harmful to the graft. Presumably, graft compliance with, or adaptability to, portal hemodynamics is key to defining SFS graft sequelae in partial liver transplantation. The factors regulating such compliance or flexibility would be those defining graft parenchymal quality, such as donor age, steatosis, and ischemic injury; other unknown factors may also precipitate SFS graft sequelae. In parallel with the investigation into the pathogenesis of SFS syndrome, many surgical strategies have been devised to overcome SFS graft sequelae. Initial efforts aimed at directly increasing the graft volume through auxiliary transplantation,18 dual grafts,19 and finally conversion from left liver grafts to right liver grafts.20 The second wave of efforts attempted to maximize graft outflow through the inclusion or reconstruction of hepatic vein tributaries21 and through various refinements of the anastomotic technique.
The third wave, which is now becoming prevalent in some adult LDLT programs, involves modification of the portal inflow to the graft. The techniques include splenic artery ligation or embolization,14 splenectomy,22 and partial portosystemic shunting,23 all of which aim to alleviate elevated portal pressure and overperfusion. Interestingly, the achievement of these inflow modifications seems to be driving a reverse trend from right liver grafts back to left liver grafts. These techniques, however, are all double-edged swords in that they carry the potential risk of reducing portal flow to a level that hampers functional recovery and regeneration of the graft. In addition, the critical period of graft injury from overperfusion of an SFS graft is limited: a permanent or irreversible modification may alter the graft status in the chronic phase, and a short-term benefit does not necessarily justify a lasting intervention. What is an ideal (ie, precise and reproducible) criterion for surgical intervention in SFS grafts? Is it the estimated graft volume or the actual graft weight? Liver size varies even among healthy individuals and potentially depends on age, gender, race, nutritional or habitual status, era, and other unknown factors; the estimated graft volume is only a rough marker of graft size. The real graft size relative to the recipient's condition in the early phase of transplantation is determined by recipient demand and by the adaptability of the graft to the insults of surgery and ischemia/reperfusion, both of which may be affected by many unknown factors. The portal pressure or flow volume after reperfusion, if measured very strictly, is a candidate index of graft compliance with portal hemodynamics or adaptability to their deleterious effects. However, our knowledge of the dynamic process of graft adaptation and its safety limits is still insufficient.
All these points are potential targets of future research. We should always ask ourselves whether we are scalded dogs fearing cold water (and vice versa).

LDLT, living donor liver transplantation; SFS, small for size.